We will continue with the dog breeds model you built over lunch. In this notebook, we will:
1. Deploy the model
2. Make a remote prediction from a Python backend
3. Make a remote prediction from a JS frontend
You will need the model ID of the model you built over lunch. In Pantheon (the GCP console), go to:
1. Artificial Intelligence => Vision
2. From the Vision tab, select AutoML Classification
3. From the AutoML Classification tab, select Models
4. Find your model (e.g., Dog Breeds) and verify that its status is 'Training Successful'
You can also list your models programmatically; see the sketch after the AutoML client is created below.
In [ ]:
# Set your project ID
PROJECT_ID = "[your-project-id]" #@param {type:"string"}
!gcloud config set project $PROJECT_ID
# This is the default, don't change it
COMPUTE_REGION="us-central1"
import sys
# If you are running this notebook in Colab, run this cell and follow the
# instructions to authenticate your GCP account. This provides access to your
# Cloud Storage bucket and lets you submit training jobs and prediction
# requests.
if 'google.colab' in sys.modules:
  from google.colab import auth as google_auth
  google_auth.authenticate_user()
# If you are running this notebook locally, replace the string below with the
# path to your service account key and run this cell to authenticate your GCP
# account.
else:
  %env GOOGLE_APPLICATION_CREDENTIALS your_path_to_credentials.json
In [ ]:
import tensorflow as tf
import numpy as np
# import the Google AutoML client library
from google.cloud import automl_v1beta1 as automl
In [ ]:
# Create an AutoML client
client = automl.AutoMlClient()
# Derive the full GCP path to the project
project_location = client.location_path(PROJECT_ID, COMPUTE_REGION)
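If you don't remember your model ID, you can list the models under this project location instead of navigating the console. This is a minimal sketch assuming the v1beta1 list_models call; the model ID is the last segment of each model's resource name.
In [ ]:
# List the models in this project/region (a sketch, assuming the v1beta1
# list_models API).
for model in client.list_models(project_location):
  # model.name looks like projects/.../locations/.../models/MODEL_ID
  print("Display name: {}".format(model.display_name))
  print("Model ID: {}".format(model.name.split("/")[-1]))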
In [ ]:
# Set your model ID here
model_id="[my-model-id]"
# Get the full path of the model.
model_full_id = client.model_path(PROJECT_ID, COMPUTE_REGION, model_id)
In [ ]:
# Deploy the model.
response = client.deploy_model(model_full_id)
# Block until the deployment operation completes (this can take several minutes).
print(response.result())
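Optionally, confirm the model is serving before sending prediction requests. This is a sketch that assumes the v1beta1 Model message exposes a deployment_state field.
In [ ]:
# Fetch the model and print its deployment state (assumption: the v1beta1
# Model message has a deployment_state field).
model = client.get_model(model_full_id)
print("{}: {}".format(model.display_name, model.deployment_state))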
In [ ]:
IMAGE_PATH="gs://" + PROJECT_ID + "-vcm/dog_breeds/fff43b07992508bc822f33d8ffd902ae.jpg"
!gsutil cp $IMAGE_PATH ./
In [ ]:
# Specify a path to an Image
IMAGE_PATH="fff43b07992508bc822f33d8ffd902ae.jpg"
# Get the full path of the model.
model_full_id = client.model_path(PROJECT_ID, COMPUTE_REGION, model_id)
# Create client for prediction service.
prediction_client = automl.PredictionServiceClient()
# Read the image and assign to payload.
with open(IMAGE_PATH, 'rb') as image_file:
  content = image_file.read()
payload = {"image": {"image_bytes": content}}
# params holds additional, domain-specific parameters.
# score_threshold filters out results scoring below the given value
# (pass it as a string, e.g. "0.5").
params = {}
score_threshold = None
if score_threshold:
  params = {"score_threshold": score_threshold}
response = prediction_client.predict(model_full_id, payload, params)
print("Prediction results:")
for result in response.payload:
  print("Predicted class name: {}".format(result.display_name))
  print("Predicted class score: {}".format(result.classification.score))
In [ ]:
!curl -X POST \
-H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
-H "Content-Type: application/json" \
https://automl.googleapis.com/v1beta1/projects/${PROJECT_ID}/locations/us-central1/models/${model_id}:predict \
-d '{ "payload" : { "image": { "imageBytes" : "/9j/4AAQSkZJRgABAQAAAQ … "}}}'
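This REST request is the same one a JS frontend would send for step 3: a POST to the :predict endpoint with an OAuth access token and the image as a base64 string in imageBytes. Below is a sketch of building that JSON body in Python; request.json is a hypothetical filename you could then pass to curl with -d @request.json.
In [ ]:
# Build the REST request body: base64-encode the image and write it to a JSON
# file (request.json is a hypothetical filename for use with `curl -d @...`).
import base64
import json

with open(IMAGE_PATH, 'rb') as f:
  encoded = base64.b64encode(f.read()).decode('utf-8')

with open('request.json', 'w') as f:
  json.dump({"payload": {"image": {"imageBytes": encoded}}}, f)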
In [ ]:
# Undeploy the model.
response = client.undeploy_model(model_full_id)
# Block until the undeployment operation completes.
print(response.result())
To clean up all GCP resources used in this project, you can delete the GCP project you used for the tutorial.